Wednesday, November 06, 2013
Rack Mounted Micro-turbines for Data Centers
Syracuse University has twelve 780 kW microturbines, but these are in a centralized facility, not distributed. The units are in two groups of three each on a separate power bus. Heat from the turbines is directed to absorption chillers for cooling the servers. Recovering energy from the waste heat would be much more difficult if the generators were distributed around the racks.
Wednesday, May 09, 2012
Rittal Data Centre Container
The RDCC uses "direct free cooling" to lower energy use. In cold weather the normal air conditioning is turned off and filtered air is circulated. Rittal point out that in Canberra the outside air temperature is low enough to be used for direct cooling most of the time.
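As a back-of-the-envelope sketch (my numbers, not Rittal's), the usefulness of direct free cooling at a site can be estimated by counting the hours the outside air is below the supply-temperature threshold. The 22°C threshold and the sample day below are assumptions for illustration only:

```python
def free_cooling_fraction(hourly_temps_c, threshold_c=22.0):
    """Fraction of hours cool enough for direct (air-side) free cooling.

    hourly_temps_c: iterable of outside-air temperatures, one per hour.
    threshold_c: maximum outside temperature at which filtered outside
    air can be supplied directly to the equipment (assumed value).
    """
    temps = list(hourly_temps_c)
    if not temps:
        raise ValueError("no temperature data")
    usable = sum(1 for t in temps if t <= threshold_c)
    return usable / len(temps)

# Toy example: a Canberra-like day, cool overnight, warm in the afternoon.
day = [12, 11, 10, 10, 9, 9, 11, 14, 17, 20, 23, 25,
       27, 28, 27, 25, 23, 21, 19, 17, 16, 15, 14, 13]
print(round(free_cooling_fraction(day), 2))
```

With a full year of hourly weather data the same calculation gives the "most of the time" figure Rittal refer to.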
The unit has standard "twistlock" points on the lower corners for securing in transport and shackles on the top corners for lifting. However, the unit will require special handling, as it is 3 m wide, wider than the standard 2.438 m width of an ISO shipping container.
Rittal argue that the larger container is more practical, particularly where containers are used for personnel. However, it would be wasteful to take a unit designed for equipment and use it for personnel. It would be more efficient to have one container for the equipment and a lower cost transportable building for personnel.
Monday, December 26, 2011
CSIRO Requires a Modular Data Centre

ATM ID: CSIRORFT2011-064
Agency: CSIRO
Category: 43190000 - Communications Devices and Accessories
Close Date & Time: 27-Jan-2012 2:00 pm (ACT Local time) ...
Description:
CSIRO is seeking an innovative Data Centre solution that can readily adapt to dynamic Data Centre space, power and cooling requirements over the life of the facility. The overall solution must provide a complete turnkey solution inclusive of all site infrastructures necessary to function as a standalone Data Centre and will be assessed on its ability to be delivered as a prefabricated, preassembled and pretested suite of modules. ...
From: Request for Tender for a Modular Data Centre, CSIRO, 22-Dec-2011
Friday, August 13, 2010
Small Modular Data Centres
Before the conference presentations I looked at some of the products on display. SGI have their shipping container data centres. However, these are a bit large for the average Australian organisation's data centre requirements. Two products on a better scale are GreenEdgeData's shipping container product and B&R Data Systems' B&R AusRack with a CoolDoor. The "cool door" has the cooling system built into a standard rack cabinet.
The combination of these technologies allows for a data centre as small as one 20 foot shipping container. For larger installations they can provide multiple modules with not just the racks for computer equipment, but also office space for the staff. Lex Brasell, Technical Director of GreenEdgeData, mentioned they had designed a whole modular data centre in about a dozen containers. Some customers see these as a way to save having to build a building. But in practice it still makes more sense to put the containers inside a warehouse type building. I suggested that Lex might like to commission one of Australia's architects to design a tensile fabric roof for the containers. Another option would be a corrugated steel shed, similar to that designed by Glenn Murcutt for Lerida Estate Winery. The idea would be not only practical (to protect the equipment in the containers) but would also provide an attractive appearance for the facility, so it does not just look like a container port.
As well as permanent data centres, modular systems can also be used for relocatable data centres. A major user of these is the military. As an example, the US military will be carrying containerised facilities on board their Australian-designed Joint High Speed Vessels, and the Australian Defence Force on its two Canberra class Landing Helicopter Dock ships (LHD).
Tuesday, February 23, 2010
Transportable Data Centre for Broadcaster and Bulldozer Company
Friday, December 18, 2009
IBM Portable Modular Data Center for WA
The IBM Rear Door Heat Exchangers replace the usual perforated metal doors of a standard equipment rack with a water-cooled unit. The hot air from the back of the equipment is cooled as it leaves the rack. This contrasts with the approach of APC and other vendors, who cool the hot air behind the rack. The IBM approach would increase the complexity of the installation, with plumbing full of cold water on the moving door of each equipment rack.
The data centre is claimed to be "portable", but IBM also talks of a concrete slab on which the containers, generator and chiller will be installed. It is not clear how the system could be easily portable if it needs a concrete slab to be laid. A system using screw piles with twistlocks, which attach to the standard ISO container connectors, would seem to make more sense.
One point not made clear is why WesTrac would need such a large portable data centre. All of the data processing for a modern medium sized company would fit in a couple of equipment racks about the size of a filing cabinet. If the equipment is intended to support customers online, then there is no need for the equipment to be portable, or to be located in a remote area, as it could be as easily located anywhere in the world with Internet access. It is difficult to see the need for this much data centre capacity in an isolated location not connected to the Internet.
SYDNEY, Australia - 17 Dec 2009: IBM (NYSE: IBM) today announced that WesTrac Pty Ltd, an industrial machinery supplier headquartered in Perth, has selected IBM to design and implement a Portable Modular Data Center (PMDC) solution to provide the company with a flexible, cost-effective data centre to meet its immediate business needs as well as support future IT growth.
Faced with the need for additional data centre capacity fuelled by a major IT project and unable to secure more space in its own data centre or through traditional co-location with data centre operators in Perth, WesTrac turned to IBM. With tight project deadlines, WesTrac selected IBM's PMDC as the right solution offering a compact, fully functional, high-density and highly protected data centre, housed within two 6.1 metre customised shipping containers. The IBM solution, due for completion in February, will allow WesTrac to avoid the cost associated and time and space required with building a new facility.
Further:
- The PMDC will be customised to meet WesTrac's specific requirements and can support multiple technology vendors and multiple systems in an industry standard rack environment.
- The portability of the PMDC and its fully insulated and sealed containers means that it can be shipped and deployed into any environment and can be easily moved or relocated to any of WesTrac's Australian locations or where needed.
- The PMDC solution provides a scalable platform. Should additional capacity be required in the future, WesTrac can easily expand the PMDC solution by extending to multiple containers.
- The PMDC has the flexibility to be reprovisioned as a disaster recovery facility or as a live-live facility at any of WesTrac's locations in WA, NSW or the ACT.
"After assessing solutions from other vendors, WesTrac is pleased to select IBM to implement a scalable, flexible and portable data centre facility," said Mark Curtis, Communications Infrastructure Manager, WesTrac.
"This agreement provides us with a complete solution and, most importantly, enables all IT equipment to be easily serviced and maintained from within a closed, physically secure and environmentally tight container. All managed and delivered by IBM, WesTrac will benefit from temporary hosting during transitioning stages, project financing, and ultimately, permanent IT accommodation."
"IBM is delighted to work with WesTrac to design and deliver a PMDC solution to provide them with a quickly delivered, cost-effective and flexible data centre alternative," said David Yip, Site and Facilities Services Business Executive, IBM Australia. "The PMDC offering, part of the IBM Data Center Family of modular solutions, is designed as a flexible option for companies requiring remote or temporary data centre capacity to support their business growth."
WesTrac's PMDC solution will consist of two containers: one purpose built for IT equipment, using IBM Rear Door Heat Exchanger cooling doors and overhead cooling for the most efficient cooling solution; the other for services infrastructure, including uninterruptible power supply (UPS) and batteries, chiller unit, cooling fan coils, electrical and mechanical distribution gear and a configured 400 kVA engine generator.
Further, IBM will also purpose-build a concrete slab on which the PMDC containers, generator and second chiller unit will be installed. An early warning fire detection system, fire suppression system, fingerprint access system and video surveillance provide the required security for the solution.
The agreement was signed in December 2009.
About WesTrac
WesTrac is one of the largest Caterpillar dealerships in the world, servicing the territories of Western Australia, New South Wales, The Australian Capital Territory and Northern China. Established in 1989, WesTrac® is a wholly owned subsidiary company of Australian Capital Equity, which is owned by Kerry Stokes. WesTrac offers total support for customers at every stage of their Equipment Management Cycle. The comprehensive solution offers a wide choice of equipment options, parts, servicing and maintenance support, that is amongst the best in the industry. ...
From: "WesTrac Selects IBM's Portable Modular Data Center" , Media Release, IBM, 17 Dec 2009
Wednesday, August 19, 2009
Datapod Shipping Container Sized Modular Data Center Components
Two significant differences of the Datapod system are that it uses neither a custom cooling system nor standard shipping containers. Datapod use APC's hot aisle technology (as used by Canberra Data Centre), with two rows of back to back racks in a module. This allows for the easy installation of equipment, with readily available components. Local technicians will be familiar with this system and be able to support and expand it.
Instead of standard welded-steel shipping containers, lightweight insulated removable panels are used for the walls of the Datapod modules. This allows the sides to be opened up for access. It does limit the shipping and placement options for the system compared to those from other vendors, but this should not be a problem in real systems.
In theory, the IBM and Sun shipping container data centres could be stacked with other cargo and transported on the deck of a container ship. They could also be installed outdoors, relying on the weatherproof container to provide protection. Such systems are favoured for use by the military in harsh conditions.
However, it is unlikely a container full of millions of dollars of computer equipment is going to receive rough handling during transport, or be operated outdoors. Even the military are likely to transport the containers within a ship, such as the new Joint High Speed Vessel (JHSV), not on the deck.
It is very unlikely that a company or government agency is going to simply dump data center containers in their car park and wire them up. Instead a building will be built to house the equipment. The building need be little more than a shed, to protect the equipment from the elements and provide physical security. Standard modularised building components could be used to erect such a building quickly and cheaply.
Wednesday, May 13, 2009
Green Data Centre In Canberra
The facility uses APC’s "Hot Aisle" system. Two rows of computer racks are placed back to back, with a polycarbonate roof and doors at either end enclosing the hot air. Coolers are placed at intervals in the racks, drawing in the hot air, cooling it and supplying it to the front of the racks. The coolers are supplied with chilled water from a central plant. The result is that the cooling is supplied where it is needed, making the system more efficient and more flexible.
The APC system has all power, data and the cooling supplied from above. There is no need for a false floor. Pods can be devoted to a particular client and even isolated with a wall where security requires. Smaller clients rent racks in a shared pod.
The central chiller plant has multiple units and an insulated tank to hold a supply of cold water. This allows the load on the chillers to be balanced and a backup supply of chilled water if the units have to be shut down (or mains power is lost).
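A rough sizing calculation shows what such a tank buys you: the usable cooling energy is the water mass times its specific heat times the allowable temperature rise. The tank size, temperature rise and load below are my assumed figures, not CDC's:

```python
def tank_ride_through_minutes(volume_m3, delta_t_k, load_kw):
    """Minutes of cooling a chilled-water tank can supply on its own.

    volume_m3: tank volume; delta_t_k: allowable water temperature rise
    (supply to return); load_kw: heat load on the cooling loop.
    Uses water density ~1000 kg/m3, specific heat ~4.19 kJ/(kg K).
    """
    energy_kj = volume_m3 * 1000.0 * 4.19 * delta_t_k
    seconds = energy_kj / load_kw  # 1 kW = 1 kJ/s
    return seconds / 60.0

# Assumed example: a 50 m3 tank, 6 K rise, 500 kW of heat load.
print(round(tank_ride_through_minutes(50, 6, 500), 1))  # about 42 minutes
```

Even a modest tank gives tens of minutes of cooling, which is ample time to restart chillers or start generators.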
The APC pods have battery backup, to keep servers running until the multiple diesel generators start to supply power. But this full backup power is expensive. I suggested to CDC that there would be scope for using the resiliency features of the web and the power saving of modern servers to provide clients with a lower cost option. Web servers could be programmed to reduce their power use during a mains failure, by lowering their serving rate. The customers using this option could be charged a lower rental rate, as they would be making less use of the backup power. Those customers with an alternate server at another location could rely on that server taking the load. Otherwise, a well designed web application would automatically provide the essential information (such as the text) and delay delivering non-essential information (such as graphics).
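The load-shedding idea can be sketched very simply: while on battery, the server keeps sending the essential content (the text of the page) and refuses the non-essential assets (the graphics). This is a minimal illustration of the policy, not any vendor's implementation; the content types chosen as "essential" are my assumption:

```python
ESSENTIAL_TYPES = {"text/html", "application/json"}  # assumed essentials

def should_serve(content_type, on_battery):
    """Decide whether to serve a resource during a mains failure.

    On mains power everything is served. On battery, only essential
    content types go out; graphics and other heavy assets get refused
    (e.g. with HTTP 503) so the servers can run at reduced power.
    """
    if not on_battery:
        return True
    return content_type in ESSENTIAL_TYPES

print(should_serve("image/png", on_battery=False))  # served normally
print(should_serve("image/png", on_battery=True))   # shed to save power
print(should_serve("text/html", on_battery=True))   # text still delivered
```

A real deployment would hook a check like this into the web server's request handling, driven by a signal from the UPS.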
The CDC facility provides an alternative to companies and government agencies building their own facilities. All but the largest agencies would have difficulty meeting the stringent requirements for such facilities and the increasingly stringent additional environmental requirements. The power to the CDC's pods is separately metered, allowing the customers to each be charged for electricity used (as well as for cooling). This would assist with carbon emission reporting and also to show the power savings.
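Separately metered pods make the carbon reporting straightforward: multiply each customer's metered energy by the grid emission factor. The factor and readings below are placeholders; a real report would use the published factor for the local grid:

```python
def carbon_report(meter_readings_kwh, emission_factor_kg_per_kwh=0.9):
    """Per-customer CO2 estimate from separately metered pod power.

    meter_readings_kwh: {customer: kWh used in the reporting period}.
    emission_factor_kg_per_kwh: grid emission factor (assumed value).
    Returns kg of CO2 per customer, rounded to one decimal place.
    """
    return {customer: round(kwh * emission_factor_kg_per_kwh, 1)
            for customer, kwh in meter_readings_kwh.items()}

# Assumed monthly readings for three pod customers.
print(carbon_report({"agency-a": 12000, "agency-b": 3500, "agency-c": 800}))
```

The same per-customer figures also show each tenant the savings from consolidating or replacing inefficient servers.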
Government agencies can rationalise their computing equipment by placing it in such a facility. But it would be unfortunate if they simply took a lot of old inefficient equipment and put it in a new centre. Agencies need to look at rationalising the number of servers they use and the efficiency of their existing equipment.
A more difficult task would then be for the Government to rationalise IT use between agencies. There seems no good reason why dozens of agencies run their own web servers, records management, financial and human resource systems. As these systems become server applications with web user interfaces, there is increased scope for rationalisation. If government systems were rationalised in this way, only a few data centres the size of the CDC facility would be needed to service all of the federal government's requirements.
Monday, April 13, 2009
Google battery backed shipping container server
The batteries used seem to be gel sealed lead acid units. Presumably these are designed to last the life of the server and not be replaced individually.
The Google server looks logically designed. My only slight worry is that the unit pictured looks like a DIY prototype, not a finished product of which tens of thousands are made. The article is dated 1 April and Google has previously produced April Fools' Day jokes. But there seem to be other independent postings reporting on the design as well, from a data centre efficiency summit.
Friday, July 18, 2008
Palletized Computer Data Warehouse
Palletized Data Warehouse (PDW)
The PDW would combine the space saving features of rack mounting computers and low cost of industrial pallet equipment. Rack mounted equipment would be fixed to standard ISO pallets. These pallets would then be stacked in a warehouse, using a fork lift truck.
Webbing straps, as used in deployable military command centres would be used to fix the equipment to pallets. This would allow standard racks to be used and provide some flexibility, to allow for vibration during transport.
The modules would be assembled and tested, before being shipped to the site and plugged in. The modules would be sized to be compatible with standard industrial pallet handling equipment for ease of transport. Small vans could be used for transport, along with aircraft. The pallets could be loaded into standard ISO shipping containers for long distance transport. Individual pallets could be moved by one person with a simple hand cart and fit through a standard door and into a passenger lift.
A low cost industrial pallet rack warehouse could be used as a data centre. Equipment modules would be tested at ground level, then stacked 15m high into standard pallet racks, using fork lift trucks. Lighting and air conditioning would be hung from the ceiling, with cabling snaking down the racks, using standard industrial fittings. There would be no expensive false floor, or office quality fittings, just a sealed concrete floor. Heavy air and power conditioning equipment would be pallet mounted at ground level for fast installation and maintenance.
Staff would wear overalls and hard hats, and be trained to use safety harnesses when servicing the elevated equipment. The open design would allow for easy re-cabling and service. For any major service work, a module would be removed from the pallet rack using a fork lift truck and returned to the ground level maintenance area.
The temperature in the building would be allowed to fluctuate more than in a traditional data centre, to reduce air conditioning costs. The open design of the building would allow good air circulation for cooling. In many locations the ambient temperature would be sufficient to cool the building most of the year, with just fans needed, not air conditioning, nor complex fluid based cooling systems.
The palletized data warehouse would use much less floor space than a conventional data center and be quick to build using standardized prefabricated warehouse building modules. The data center could be finished on the outside to blend in with office buildings, or with inexpensive steel cladding in an industrial park. It would also be easier to service and take less space than an ISO containerized data center.
Data centre in a shipping container from Sun, IBM and HP
- Cooling: Densely packed rack mounted equipment is difficult to keep cool. Placing it in a cramped metal box will make this worse. Rack mounted equipment is usually designed to draw in cool air from the front and exhaust hot air out the back. This assumes there is an aisle at the front and back for the air to circulate, a false floor underneath for the cool air to be delivered, and space above the cabinets to carry the hot air away. An ISO shipping container is too small for this, and most of the designs use only one aisle down the middle with racks up against the sides of the container. Photos of the Sun system show what appear to be very large cooling air ducts coming out of the front, which have to be ducted somewhere. Other units show doors in non-standard places and lots of cables coming out of holes in the containers.
- Maintenance: The aisle at the front and back of racks not only allows air to circulate, it also provides space for maintenance workers to exchange equipment and run cables (there are a lot of cables in a data center). The width of an ISO container only allows for one narrow aisle, making maintenance difficult.
- Delivery: Rack mounted cabinets are designed to fit in the back of a small truck or plane. There are trucks with special suspension designed to carry sensitive computer equipment. Only a few specialist cargo aircraft are large enough to carry an ISO container, so the boxes would have to travel long distances by sea, road or rail. The sea, road and rail transport systems designed to handle ISO shipping containers are not intended for delicate equipment and do not protect containers from the elements. The data center would need to be very well sealed for transport to prevent water damage and be sturdy enough to prevent damage from vibration, knocks and being tilted. The containers need to have enough room in them for staff to install and maintain the equipment, so about one third to one half of each container is empty, resulting in increased shipping costs.
- Installation: Rack mounted cabinets are designed to fit through a space about the size and shape of a standing person, so they can be pushed through a normal doorway and into a passenger lift, using a simple handcart. The equipment is therefore compatible with office buildings. In contrast, shipping containers require a very large fork lift truck to move them and will not fit in an ordinary office building. They would need a specially designed warehouse-like building or annex to a building. ISO shipping containers are designed to be weatherproof, but setting up a data center outdoors would require all of the conduits to be carefully sealed and would make maintenance very difficult, as contaminants would enter every time a door was opened. There have been many modular building systems based on ISO containers which have failed due to leaks. Having a container crammed with sensitive electrical equipment in a leaky steel box would be disastrous. Also, the average corporation does not want something which looks like a container wharf or an electricity substation next to its office building. The plan for a major data center in Canberra is in jeopardy due to opposition to the collocated power station. A containerized data center is likely to draw planning objections.
- Safety: Data center equipment is designed to be maintained with the power switched on. Staff need to be able to replace one computer in a rack, while the rest of the equipment keeps working. Working in a cramped metal box will be far less safe than a traditional data center. There will be less room for the staff to work and the walls will form one sealed electrically conductive box. Noise from the equipment is likely to be higher than in a normal room. As the box is designed to be sealed, it will need to have vents added to allow for fire fighting. If inert gas firefighting is used, it will be deadlier than in a conventional room and there will be fewer escape exits. Staff may have less than a minute to escape before being killed by the fire suppression system.
An alternative to the shipping container data center, which I suggested some time ago, is a pallet data warehouse. With this the computers would be mounted on standard shipping pallets. The pallets would be simply placed on the floor. For very high density installations they could be stacked in a warehouse-like building using small fork lift trucks. Pallets are designed to fit in small trucks and aircraft. Smaller ISO pallets are designed to fit through a doorway and in a passenger lift. ISO pallets are designed to fit in ISO shipping containers, so these could be used for transport, with additional protective packing around the pallets. If needed, a shipping container data center could be built by wiring up the palletized equipment in a container.
Wednesday, December 19, 2007
Datacenter in a shipping container
For a secure freestanding structure, it might be better to use one of the modular concrete buildings designed for railway trackside electrical equipment. One of these from Garard was displayed at the Australian Rail Conference Exhibition 2007. These buildings are about the size of a shipping container made from one continuous piece of reinforced concrete. They have the advantage of having been designed to meet government security standards. The buildings can be made on site, or delivered on a truck (or train) pre-wired with the equipment installed. Because they are made of one piece of concrete, they are very secure and less likely to leak. It may also be possible to design one which would fit a shipping container inside. In that case the concrete building could be built on site or delivered empty, and then the shipping container full of computers simply slid inside.
It should be noted that shipping container data centers will not necessarily be a good use of space. The containers are narrow and will only have room for two rows of rack mounted cabinets, with a walkway between. There will only be access to the front of the cabinet, with no access to the back, making maintenance difficult. In most cases it will be better to use a larger room which can provide better access. If space is at a premium and a large data center is needed, then a pallet warehouse could be used (I suggested this to the Chinese government in 2003).
Also before investing in a new data center, an organization should conduct an inventory of its current data and processing requirements. In most cases it will be found that more efficient use of applications can be used to reduce the data and processing requirements, so that a smaller data center can be used, reducing the cost, space and energy use. Use of efficient XML based data storage and Web 2 applications can greatly reduce the needs of the organization for storage and processing.
Instead of virtualizing inefficient PC desktop applications, they can be replaced with properly engineered efficient applications designed to run remotely over a data link. This could reduce the processing requirements between ten and one hundred times. As an example, an organization which would have needed one of Sun's shipping container data centers could instead downsize to one rack mount computer, the size of a four drawer filing cabinet. Apart from being one hundredth the size and using one hundredth the power, this would cost about one hundredth as much to buy.
Outsourcing the data storage or processing to a location with more space and power can also be considered, but not necessarily as far away as Iceland. The Canberra Technology City (CTC) is a proposed large data center for government and company use in Canberra, with its own power station.
Of course, alongside the shipping container data center will be needed a shipping container cafe, for the workers. ;-)
Wednesday, December 12, 2007
Australian Rail Conference Exhibition 2007
Australian Rail Conference Exhibition. This is held in conjunction with a conference, which costs money. But like many such events, the exhibition is open to anyone from business for free. There were a number of computer and telecommunications exhibits to justify my attendance, but it was really just an excuse to look at train stuff. ;-)
Some items of interest:
Thales Australia are expanding out from defence equipment into transport, particularly rail systems. As an example, they are supplying the Communications and Surveillance Subsystem (CSS) and performing the Information and Communications Technology (ICT) System Integration (SI) for the Sydney Suburban Passenger Vehicle Public Private Partnership (PPP) Project (ie: computers and telecommunications for Sydney trains).
Ultimate Australia Transportation Equipment Pty Ltd
China South Locomotive and Rolling Stock Industry (Group) Corporation were one of several companies from China with cumbersome names selling locomotives and other railway products. They each seemed to have some form of high speed passenger train on offer as well as freight locomotives. I was unable to get their web site to work in English.
Garard were offering monolithic concrete shelters for equipment. These are buildings about the size of a shipping container made from one continuous piece of reinforced concrete. They are used to hold electrical equipment for railways, but could make very secure computer rooms. The buildings can be made on site, or delivered on a truck (or train) pre-wired with the equipment installed. Because they are made of one piece of concrete, they are very secure and less likely to leak.
Open Access displayed their Wireless Announcer. This is the wireless Emergency Warning and Intercommunication (EWIS) Alert system installed in the Sydney CBD for the APEC meeting. Units with antennas, digital radio, amplifier, loudspeakers and battery backup are mounted on poles around the city to warn in an emergency. Some units also have alphanumeric displays.
CRC for Rail Innovation is an industry-academic research collaboration. They are looking at:
Monday, August 20, 2007
Reducing Carbon Emissions from the ICT Industry
The ACS recommended:
- Extending the Energy Rating System to ICT equipment for domestic and commercial use
- Innovative technologies to reduce power consumption
- Carbon offsets to help offset the emissions being produced by ICT equipment used in the office
- Virtualisation to replace servers
- Disable screen savers and implement ‘sleep mode’ for inactive equipment.
Some themes I will be talking about at the Green CIO conference:
It will be interesting to see what the IT vendors have to offer. It will be a lot easier to get both commercial and home users to be greener if there is some new stuff they can buy, rather than just telling them to do more with less.
IT staff up to now have not made energy efficiency a priority because their bosses and clients have not seen it as one. You would not have got sacked if your data system used 10% more power than the industry average, but you would have if it was 10% less reliable.
Because this has not been a priority for the customer, it has not been a priority for the IT industry, so we do not have the standards and guidelines to help us reduce energy use and know what is good. That is now being addressed by the ACS, with the carbon audit of ICT, policy and the Green ICT Group. IT professionals have an ethical obligation to act in the public interest, even if individual customers do not want us to.
There are opportunities for IT professionals to be part of the green message of their organization. This can involve industry, government and university. A good example is the Defence Department's recently announced "Defence Future Capability Technology Centre" (DFCTC) to work on defence research projects with industry and universities from mid 2008. Much of this research, such as that on explosives, electronic warfare and electromagnetics will be defence specific, but the other areas have commercial application.
Energy saving is an important issue for the military: each extra computer and telecommunications gadget has to be powered by batteries on the soldier's back or from diesel brought in by truck or ship. Reducing energy use can make room for more ammunition. The US Army is now placing an emphasis on solar power of bases to reduce fuel use.
But you can only do so much with ad hoc power saving methods. Sustainable development engineering strategies emphasise the need for an integrated approach to energy and materials saving. Business processes need to be redesigned. IT staff can work with specialists on an overall strategy. They can assist with a web site and other online facilities to get the message out to corporate employees, including some real time graphs showing company green performance. IT professionals can act pre-emptively by proposing to use environmental standards.
The CIO can start by implementing simple measures, such as setting screen savers to reduce power and switching off unused systems automatically after hours. They can propose to implement environmental standards and provide information to staff online. But reliable, economic operation must remain the priority; environmental efficiency must come after that.
Some of the low-hanging fruit: use XML web based technology to make your applications more efficient so that a larger center is not needed, and consider outsourcing the data center to large specialists who have economies of scale.
IT professionals can help in educating other staff and providing online services which reduce travel, hardware use and the like.
It is unlikely the Government will regulate in the IT area. But the industry should get in first and implement its own guidelines and standards anyway.
Sustainable ICT can be incorporated into your strategic planning goals and targets. Strategies which will provide financial savings to offset the costs are more likely to gain corporate approval.
Much of this will have to be a DIY effort for a few years until there are consultants and companies trained up to help. There are training materials being designed and these can be incorporated into online programs, such as the ACS Computer Professional Education Program.
The IT industry needs to look at its own organisations and see if they practice what they preach, so they will credibly be able to offer advice, products and services. IT has had a clean and efficient image; in environmental terms it is in reality a dirty, wasteful business. That is a reality we have to change.